Divergence and Sufficiency for Convex Optimization

Author

  • Peter Harremoës
Abstract

Logarithmic score and information divergence appear in information theory, statistics, statistical mechanics, and portfolio theory. We demonstrate that all these topics involve some kind of optimization that leads directly to the use of Bregman divergences. If a Bregman divergence also fulfills a sufficiency condition, it must be proportional to information divergence. We demonstrate that sufficiency is equivalent to the apparently weaker notion of locality, and that it is also equivalent to the apparently stronger notion of monotonicity. These sufficiency conditions have quite different relevance in the different areas of application, and often they are not fulfilled. Sufficiency conditions can therefore be used to explain when results from one area transfer directly to another and when one should expect differences.
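To illustrate the connection the abstract draws, recall that a Bregman divergence generated by a convex function F is D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩, and that choosing F to be negative Shannon entropy recovers information (Kullback–Leibler) divergence. A minimal numerical sketch (the function names and the example vectors are illustrative, not from the paper):

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """Generic Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Negative Shannon entropy as the convex generator (natural logarithm).
def neg_entropy(x):
    return np.sum(x * np.log(x))

def grad_neg_entropy(x):
    return np.log(x) + 1.0

# Two strictly positive probability vectors.
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

# The Bregman divergence of negative entropy...
d_bregman = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
# ...coincides with information (KL) divergence for probability vectors.
d_kl = np.sum(p * np.log(p / q))
```

For probability vectors the linear terms Σp − Σq cancel, which is why the two quantities agree; for general positive measures they differ by exactly that term.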


Similar articles

On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...


f-DIVERGENCES: SUFFICIENCY, DEFICIENCY AND TESTING OF HYPOTHESES

This paper deals with f-divergences of probability measures considered in the same general form as e.g. in [12] or [45], where f is an arbitrary (not necessarily differentiable) convex function. Important particular cases or subclasses are mentioned, including those introduced by Bhattacharyya [3], Kakutani [32], Shannon [61] and Kullback with Leibler [38], Chernoff [7], Kraft [37], Matusita [...


Sufficiency and Jensen's Inequality for Conditional Expectations

For finite sets of probability measures, sufficiency is characterized by means of certain positively homogeneous convex functions. The essential tool is a discussion of equality in Jensen's inequality for conditional expectations. In particular, it is shown that characterizations of sufficiency by Csiszár's f-divergence (1963, Publ. Math. Inst. Hung. Acad. Sci. Ser. A, 8, 85-107) and by optimal...


Generic identifiability and second-order sufficiency in tame convex optimization

We consider linear optimization over a fixed compact convex feasible region that is semi-algebraic (or, more generally, “tame”). Generically, we prove that the optimal solution is unique and lies on a unique manifold, around which the feasible region is “partly smooth”, ensuring finite identification of the manifold by many optimization algorithms. Furthermore, second-order optimality condition...


Sufficiency and duality for a nonsmooth vector optimization problem with generalized $\alpha$-$d_{I}$-type-I univexity over cones

In this paper, using Clarke’s generalized directional derivative and d_I-invexity, we introduce new concepts of nonsmooth K-α-d_I-invex and generalized type I univex functions over cones for a nonsmooth vector optimization problem with cone constraints. We obtain some sufficient optimality conditions and Mond-Weir type duality results under the aforesaid generalized invexity and type I cone-univexi...



Journal:
  • Entropy

Volume 19, Issue

Pages -

Publication date: 2017